Assessing Lexical-Semantic Regularities in Portuguese Word Embeddings
Authors
Abstract
Similar resources
Reasoning about Linguistic Regularities in Word Embeddings using Matrix Manifolds
Recent work has explored methods for learning continuous vector space word representations reflecting the underlying semantics of words. Simple vector space arithmetic using cosine distances has been shown to capture certain types of analogies, such as reasoning about plurals from singulars, past tense from present tense, etc. In this paper, we introduce a new approach to capture analogies in c...
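The vector-offset analogy test mentioned above (answering "a is to b as c is to ?") can be sketched in a few lines. This is a minimal toy illustration with hand-made 3-d embeddings, not the paper's method; real embeddings are 100–300-dimensional vectors trained on large corpora.

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(u, v))
    nu = math.sqrt(sum(x * x for x in u))
    nv = math.sqrt(sum(x * x for x in v))
    return dot / (nu * nv)

def analogy(emb, a, b, c):
    """Return the word d maximizing cos(emb[b] - emb[a] + emb[c], emb[d]),
    excluding the three query words themselves."""
    target = [y - x + z for x, y, z in zip(emb[a], emb[b], emb[c])]
    candidates = [w for w in emb if w not in (a, b, c)]
    return max(candidates, key=lambda w: cosine(target, emb[w]))

# Toy embeddings where the singular->plural offset is roughly (0, 0, 1).
emb = {
    "car": [1.0, 0.0, 0.0], "cars": [1.0, 0.0, 1.0],
    "dog": [0.0, 1.0, 0.0], "dogs": [0.0, 1.0, 1.0],
    "cat": [0.5, 0.5, 0.0], "cats": [0.5, 0.5, 1.0],
}
print(analogy(emb, "car", "cars", "dog"))  # -> dogs
```

With these toy vectors, "car is to cars as dog is to ?" resolves to "dogs", mirroring the singular/plural regularity the abstract describes.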
Lexical Coherence Graph Modeling Using Word Embeddings
Coherence is established by semantic connections between sentences of a text which can be modeled by lexical relations. In this paper, we introduce the lexical coherence graph (LCG), a new graph-based model to represent lexical relations among sentences. The frequency of subgraphs (coherence patterns) of this graph captures the connectivity style of sentence nodes in this graph. The coherence o...
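The graph described above can be approximated in a few lines: sentences become nodes, and an edge links two sentences that share a lexical item. This toy sketch uses plain content-word overlap as a stand-in for the richer lexical relations (and the embedding-based variants) used in the LCG paper.

```python
# Illustrative stand-in for a lexical coherence graph: nodes are sentence
# indices; an edge (i, j) means sentences i and j share a content word.
STOPWORDS = {"the", "a", "an", "is", "are", "of", "and", "in", "it", "on", "was"}

def content_words(sentence):
    """Lowercased tokens with punctuation stripped, minus stopwords."""
    return {w.lower().strip(".,") for w in sentence.split()} - STOPWORDS

def lexical_coherence_graph(sentences):
    words = [content_words(s) for s in sentences]
    edges = set()
    for i in range(len(sentences)):
        for j in range(i + 1, len(sentences)):
            if words[i] & words[j]:  # shared content word => lexical relation
                edges.add((i, j))
    return edges

text = [
    "The cat sat on the mat.",
    "The mat was red.",
    "Paris is in France.",
]
print(lexical_coherence_graph(text))  # -> {(0, 1)}: both mention "mat"
```

Subgraph frequencies over such a graph (the "coherence patterns" of the abstract) then summarize how tightly the sentences of a text are lexically connected.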
Dict2vec : Learning Word Embeddings using Lexical Dictionaries
Learning word embeddings on large unlabeled corpus has been shown to be successful in improving many natural language tasks. The most efficient and popular approaches learn or retrofit such representations using additional external data. Resulting embeddings are generally better than their corpus-only counterparts, although such resources cover a fraction of words in the vocabulary. In this pap...
Temporal Word Analogies: Identifying Lexical Replacement with Diachronic Word Embeddings
This paper introduces the concept of temporal word analogies: pairs of words which occupy the same semantic space at different points in time. One well-known property of word embeddings is that they are able to effectively model traditional word analogies ("word w1 is to word w2 as word w3 is to word w4") through vector addition. Here, I show that temporal word analogies ("word w1 at time tα is ...
Improving Lexical Embeddings with Semantic Knowledge
Word embeddings learned on unlabeled data are a popular tool in semantics, but may not capture the desired semantics. We propose a new learning objective that incorporates both a neural language model objective (Mikolov et al., 2013) and prior knowledge from semantic resources to learn improved lexical semantic embeddings. We demonstrate that our embeddings improve over those learned solely on ...
Journal
Journal title: International Journal of Interactive Multimedia and Artificial Intelligence
Year: 2021
ISSN: 1989-1660
DOI: https://doi.org/10.9781/ijimai.2021.02.006